tokenized - definition. What is tokenized
Diclib.com
Online dictionary


Computing: the process of parsing a sequence of characters into a sequence of tokens
Lexical Analysis; Lexical analyzer; Token (parser); Lexer; Lexical token; Lexical analyser; Scanner (computing); Tokenize; Lexing; Tokenise; Tokenized; Tokenizing; Lexical parser; Tokenizer; Tokeniser; Tokenization (lexical analysis); Token splitting; Token scanner; Lexer generator; Lexer (computer science); Semicolon insertion; List of lexer generators; Lexical syntax; Lexeme (computer science); Automatic semicolon insertion; Lexers

Lexical analysis         
In computer science, lexical analysis, lexing or tokenization is the process of converting a sequence of characters (such as in a computer program or web page) into a sequence of lexical tokens (strings with an assigned and thus identified meaning). A program that performs lexical analysis may be termed a lexer, tokenizer, or scanner, although scanner is also a term for the first stage of a lexer.
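The conversion described above can be sketched with a small regex-based tokenizer. This is a minimal illustration only; the token names and patterns below are assumptions chosen for the example, not taken from any particular language specification.

```python
import re

# Each token kind is paired with a regex; the order matters, since
# earlier alternatives win when patterns could overlap.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),            # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"),   # identifiers
    ("OP",     r"[+\-*/=]"),       # single-character operators
    ("SKIP",   r"\s+"),            # whitespace, discarded below
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Convert a character sequence into a list of (kind, value) tokens."""
    tokens = []
    for m in MASTER.finditer(text):
        kind = m.lastgroup        # name of the group that matched
        if kind != "SKIP":        # drop whitespace between tokens
            tokens.append((kind, m.group()))
    return tokens

print(tokenize("x = 42 + y"))
```

Each character of the input is consumed exactly once, and the output is the "sequence of lexical tokens" the definition refers to: strings paired with an identified meaning (their token kind).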
lexical analyser         
<language> (Or "scanner") The initial input stage of a language processor (e.g. a compiler), the part that performs lexical analysis. (1995-04-05)
tokenism         
An unfair practice involving minorities
Token wife; Token black guy; Token character; Token minority; Token negro; Token woman; Token gesture; Tokenist
If you refer to an action as tokenism, you disapprove of it because you think it is just done for effect, in order to show a particular intention or to impress a particular type of person.
Is his promotion evidence of the minorities' advance, or mere tokenism?
N-UNCOUNT [disapproval]

Wikipedia

Lexical analysis

Lexical analysis (lexing or tokenization) converts a sequence of characters, such as a computer program or web page, into a sequence of lexical tokens: strings with an assigned and thus identified meaning. A lexer is generally combined with a parser, which together analyze the syntax of programming languages, web pages, and so forth.
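The lexer-plus-parser combination can be sketched in a few lines. The grammar here (sums of integers, `NUMBER (+ NUMBER)*`) and all token names are assumptions made for illustration; real parsers handle far richer grammars, but the division of labor is the same: the lexer produces tokens, the parser consumes them.

```python
import re

# Toy lexer for a grammar of integer sums (token names are illustrative).
TOKENS = re.compile(r"(?P<NUMBER>\d+)|(?P<PLUS>\+)|(?P<SKIP>\s+)")

def lex(text):
    """Produce (kind, value) tokens, discarding whitespace."""
    return [(m.lastgroup, m.group())
            for m in TOKENS.finditer(text)
            if m.lastgroup != "SKIP"]

def parse_sum(tokens):
    """Recursive-descent-style parser: evaluate NUMBER (+ NUMBER)*."""
    kind, value = tokens[0]
    assert kind == "NUMBER", f"expected NUMBER, got {kind}"
    total = int(value)
    pos = 1
    while pos < len(tokens) and tokens[pos][0] == "PLUS":
        kind, value = tokens[pos + 1]
        assert kind == "NUMBER", f"expected NUMBER, got {kind}"
        total += int(value)
        pos += 2
    return total

print(parse_sum(lex("1 + 2 + 39")))
```

Note that the parser never looks at raw characters: it works entirely over the token stream, which is what lets the two stages be written, tested, and reused independently.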

Pronunciation examples for "tokenized"
1. and so many assets that could be tokenized.
The Blockchain and The New Architecture of Trust | Kevin Werbach | Talks at Google
2. I said, I don't want to be tokenized or have a lesser
Accessibility in the Workplace | Paul Artale | Talks at Google